
    Modeling Blank Data Entries in Data Envelopment Analysis

    We show how Data Envelopment Analysis (DEA) can handle missing data. When blank data entries are coded by appropriate dummy values, the DEA model automatically excludes the missing data from the analysis. We extend this result to weight-restricted DEA models by presenting a simple modification to the usual weight restrictions, which automatically relaxes a weight restriction in case of missing data. Our approach is illustrated by a case study of international sustainable development indices.

    Keywords: Data Envelopment Analysis, Weight Restrictions, Missing Data, Blank Entries
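
    As a rough illustration of the mechanism described above, the sketch below solves a standard input-oriented CCR multiplier model with scipy.optimize.linprog, coding missing outputs as zero and missing inputs as a large constant so that the optimal multiplier weights effectively exclude them. The toy data, the coding convention, and the model variant are illustrative assumptions, not a reproduction of the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 firms, 2 inputs, 2 outputs. np.nan marks blank entries.
X = np.array([[2.0, 3.0], [4.0, np.nan], [3.0, 2.0], [5.0, 4.0]])
Y = np.array([[1.0, 2.0], [2.0, 1.0], [np.nan, 2.0], [3.0, 3.0]])

BIG = 1e6  # dummy value for missing inputs (illustrative convention)
Xc = np.where(np.isnan(X), BIG, X)   # missing input -> large constant
Yc = np.where(np.isnan(Y), 0.0, Y)   # missing output -> zero

def ccr_efficiency(k):
    """Input-oriented CCR multiplier model for firm k:
       max u'y_k  s.t.  v'x_k = 1,  u'Y_j - v'X_j <= 0 for all j,  u, v >= 0."""
    n, m = Xc.shape
    _, s = Yc.shape
    # Decision vector z = [v (m), u (s)]; linprog minimizes, so negate objective.
    c = np.concatenate([np.zeros(m), -Yc[k]])
    A_ub = np.hstack([-Xc, Yc])          # u'Y_j - v'X_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([Xc[k], np.zeros(s)]).reshape(1, -1)
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

for k in range(4):
    print(f"firm {k}: efficiency = {ccr_efficiency(k):.3f}")
```

    A zero output contributes nothing to the objective, and the normalization v'x_k = 1 drives the weight on a BIG input toward zero, so the missing entries drop out of the evaluation automatically.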

    Valuing Environmental Factors in Cost-Benefit Analysis Using Data Envelopment Analysis

    Environmental cost-benefit analysis (ECBA) refers to the social evaluation of investment projects and policies that involve significant environmental impacts. Valuation of the environmental impacts in monetary terms forms one of the critical steps in ECBA. We propose a new approach for environmental valuation within the ECBA framework that is based on data envelopment analysis (DEA) and does not require price estimates for environmental impacts from traditional revealed or stated preference methods. We show that DEA can be adapted to the context of CBA by using absolute shadow prices instead of the traditionally used relative prices. We also discuss how the approach can be used for sensitivity analysis, which is an important part of ECBA. We illustrate the application of the DEA approach to ECBA by means of a hypothetical numerical example in which a household considers investment in a new sport utility vehicle.

    Keywords: Cost-Benefit Analysis, Data Envelopment Analysis, Eco-Efficiency, Environmental Valuation, Environmental Performance, Performance Measurement
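
    The sketch below illustrates one way such absolute shadow prices might be recovered: solve an input-oriented CRS envelopment LP with scipy.optimize.linprog and scale the duals of the environmental constraints by the dual of the monetary input, yielding prices in currency units per impact unit. The data, the treatment of impacts as inputs, and the normalization are my own illustrative assumptions, not the paper's exact model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 alternatives (e.g., vehicle choices), 1 monetary input (cost, EUR),
# 2 environmental impacts treated as inputs to be contracted (CO2 g/km, NOx g/km).
X = np.array([[30000, 180, 0.06],
              [35000, 150, 0.05],
              [28000, 210, 0.08],
              [40000, 120, 0.04],
              [33000, 160, 0.05]], dtype=float)
y = np.ones(5)  # one unit of transport service from each alternative

def shadow_prices(k):
    """Input-oriented CRS envelopment LP for unit k:
       min theta  s.t.  X'lam <= theta * x_k,  y'lam >= y_k,  lam >= 0.
    Duals (marginals) of the input constraints, scaled by the dual of the
    monetary input, give absolute shadow prices in EUR per impact unit."""
    n, m = X.shape
    # variables z = [theta, lam_1..lam_n]
    c = np.concatenate([[1.0], np.zeros(n)])
    A_ub = np.hstack([-X[k].reshape(m, 1), X.T])   # X'lam - theta*x_k <= 0
    b_ub = np.zeros(m)
    A_out = np.concatenate([[0.0], -y]).reshape(1, -1)  # -y'lam <= -1
    A_ub = np.vstack([A_ub, A_out])
    b_ub = np.append(b_ub, -1.0)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    duals = res.ineqlin.marginals[:m]    # duals of the m input constraints
    money = duals[0]                     # dual of the monetary input
    return res.fun, duals / money if money != 0 else duals

theta, prices = shadow_prices(0)
print("efficiency:", round(theta, 3))
print("absolute shadow prices (EUR per unit of each input):", np.round(prices, 2))
```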

    Stochastic Dominance Efficiency Tests under Diversification

    This paper focuses on Stochastic Dominance (SD) efficiency in finite empirical panel data. We analytically characterize the sets of unsorted time series that dominate a given evaluated distribution by First-, Second-, and Third-order SD. Using these insights, we develop simple Linear Programming and 0-1 Mixed Integer Linear Programming tests of SD efficiency. The advantage over earlier efficiency tests is that the proposed approach explicitly accounts for diversification. Allowing for diversification can both improve the power of empirical SD tests and enable SD-based portfolio optimization. A simple numerical example illustrates the SD efficiency tests. Discussion of the application potential and future research directions concludes.

    Keywords: Stochastic Dominance, Portfolio Choice, Efficiency, Diversification, Mathematical Programming
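
    A minimal sketch of the second-order (SSD) test under diversification follows, assuming the LP formulation in which a portfolio X@lam dominates the evaluated series z whenever X@lam >= W@z for some doubly stochastic matrix W; the first- and third-order tests mentioned in the abstract would need additional 0-1 machinery not reproduced here. The data and the exact objective are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def ssd_inefficiency(X, z):
    """Test whether return series z is SSD-efficient relative to diversified
    portfolios of the assets in X (T periods x n assets). X@lam >= W@z with W
    doubly stochastic implies SSD dominance; maximize the mean return of the
    dominating portfolio. A positive optimum flags z as SSD-inefficient."""
    T, n = X.shape
    # variables: [lam (n), vec(W) row-major (T*T)]
    c = np.concatenate([-X.mean(axis=0), np.zeros(T * T)])  # max mean(X@lam)
    # dominance constraints: (W@z)_t - (X@lam)_t <= 0
    A_ub = np.hstack([-X, np.kron(np.eye(T), z.reshape(1, T))])
    b_ub = np.zeros(T)
    A_eq = np.vstack([
        np.hstack([np.zeros((T, n)), np.kron(np.eye(T), np.ones((1, T)))]),  # W row sums = 1
        np.hstack([np.zeros((T, n)), np.kron(np.ones((1, T)), np.eye(T))]),  # W col sums = 1
        np.concatenate([np.ones(n), np.zeros(T * T)]).reshape(1, -1),        # sum(lam) = 1
    ])
    b_eq = np.ones(2 * T + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + T * T), method="highs")
    return -res.fun - z.mean()   # mean-return gain of the dominating portfolio

rng = np.random.default_rng(0)
X = rng.normal(0.05, 0.1, size=(12, 3))   # 12 periods, 3 assets
z = X[:, 0]                                # evaluate holding asset 0 only
print("SSD inefficiency (mean excess return):", round(ssd_inefficiency(X, z), 4))
```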

    On the Anatomy of Productivity Growth: A Decomposition of the Fisher Ideal TFP Index

    Decompositions of productivity indices contribute to our understanding of what drives observed productivity changes by providing a detailed picture of their constituents. This paper presents the most comprehensive decomposition of total factor productivity (TFP) to date. Starting from the Fisher ideal TFP index, we systematically isolate the productivity effects of changes in production technology, technical efficiency, scale efficiency, allocative efficiency, and market strength. The three efficiency components further decompose into input- and output-side effects. The proposed decomposition is illustrated with an empirical application to a sample of 459 Finnish farms over the period 1992-2000.

    Keywords: index numbers and aggregation, Total Factor Productivity (TFP) measurement, Fisher ideal index, Malmquist index, decompositions, agriculture
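
    For reference, the Fisher ideal TFP index that the decomposition starts from is the ratio of the Fisher output and input quantity indices, each a geometric mean of the Laspeyres and Paasche indices. The sketch below computes the aggregate index on made-up two-period data; the decomposition terms themselves require estimated distance functions and are not reproduced here.

```python
import numpy as np

def fisher_index(q0, q1, p0, p1):
    """Fisher ideal quantity index: geometric mean of Laspeyres and Paasche."""
    laspeyres = p0 @ q1 / (p0 @ q0)
    paasche = p1 @ q1 / (p1 @ q0)
    return np.sqrt(laspeyres * paasche)

# Hypothetical two-period data for one farm: outputs y with prices p,
# inputs x with prices w (period 0 and period 1).
y0, y1 = np.array([100.0, 40.0]), np.array([110.0, 42.0])
p0, p1 = np.array([1.0, 2.5]), np.array([1.1, 2.4])
x0, x1 = np.array([50.0, 20.0, 5.0]), np.array([52.0, 19.0, 5.0])
w0, w1 = np.array([0.8, 1.5, 10.0]), np.array([0.9, 1.4, 11.0])

# TFP change = Fisher output quantity index / Fisher input quantity index.
tfp = fisher_index(y0, y1, p0, p1) / fisher_index(x0, x1, w0, w1)
print("Fisher ideal TFP index:", round(tfp, 4))
```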

    (In)Efficient management of interacting environmental bads

    Many environmental problems involve the transformation of multiple harmful substances into one or more damage agents, much in the same way as a firm transforms inputs into outputs. Yet environmental management differs from a firm's production in one important respect: while a firm seeks an efficient input allocation to maximize profit, an environmental planner allocates abatement efforts to render the production of damage agents as inefficient as possible. We characterize a solution to the multiple pollutants problem and show that the optimal policy is often a corner solution, in which abatement is focused on a single pollutant. Corner solutions may arise even in well-behaved problems with concave production functions and convex damage and cost functions. Furthermore, even concentrating on the wrong pollutant may yield greater net benefits than setting uniform abatement targets for all harmful substances. Our general theoretical results on the management of flow and stock pollutants are complemented by two numerical examples illustrating the abatement of eutrophying nutrients and greenhouse gases.
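
    The toy example below illustrates how a corner solution can dominate a uniform abatement split when damages depend on the interaction of two pollutants. The functional forms and numbers are hypothetical, chosen only to make the point, and are not taken from the paper.

```python
import numpy as np

# Hypothetical two-pollutant example: emissions e_i = base_i - abatement a_i,
# damage depends on the interaction of the two pollutants, abatement costs
# are convex.
base = np.array([10.0, 10.0])

def total_cost(a):
    e = base - a
    damage = 2.0 * e[0] * e[1]                 # interacting damage agents
    cost = 0.5 * a[0] ** 2 + 0.5 * a[1] ** 2   # convex abatement costs
    return damage + cost

# Allocate a fixed abatement budget between the two pollutants and compare
# interior (uniform) allocations with corner allocations.
budget = 8.0
shares = np.linspace(0.0, 1.0, 201)
costs = [total_cost(np.array([s * budget, (1 - s) * budget])) for s in shares]
best = shares[int(np.argmin(costs))]
print(f"optimal share of budget on pollutant 1: {best:.2f}")  # a corner: 0 or 1
print(f"uniform split cost: {total_cost(np.array([4.0, 4.0])):.1f}, "
      f"corner cost: {total_cost(np.array([8.0, 0.0])):.1f}")
```

    In this example the uniform split costs 88 while either corner costs 72, so focusing all abatement on one pollutant is optimal even though the cost functions are convex.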

    The Law of One Price in Data Envelopment Analysis: Restricting Weight Flexibility Across Firms

    The Law of One Price (LoOP) states that all firms face the same prices for their inputs and outputs in the competitive market equilibrium. This law has powerful implications for productive efficiency analysis, which have remained unexploited thus far. This paper shows how LoOP-based weight restrictions can be incorporated in Data Envelopment Analysis (DEA). Utilizing the relation between the industry-level and the firm-level cost efficiency measures, we propose to apply a set of input prices that is common to all firms and that maximizes the cost efficiency of the industry. Our framework allows for firm-specific output weights and variable returns to scale, and preserves the linear programming structure of standard DEA. We apply the proposed methodology to evaluating the research efficiency of economics departments at Dutch universities. This application shows that the methodology is computationally tractable for practical efficiency analysis and that it helps in deepening the DEA analysis.

    Keywords: Data Envelopment Analysis; Law of One Price; industry-level efficiency; weight restrictions; research efficiency
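
    The sketch below illustrates the industry-level objective under strong simplifications (single output, constant returns to scale, a grid search over normalized price vectors): find common input prices that maximize industry cost efficiency. The paper itself formulates the problem as a single linear program with firm-specific output weights and variable returns to scale, which this toy version does not reproduce.

```python
import numpy as np

# Toy single-output, CRS example: X holds firms' inputs (n x m), y outputs (n,).
X = np.array([[4.0, 2.0], [2.0, 4.0], [3.0, 3.0], [5.0, 5.0]])
y = np.array([2.0, 2.0, 2.5, 3.0])

def industry_cost_efficiency(w):
    """Under CRS with a single output, the minimum cost of producing y_i at
    common prices w is y_i * min_j (w @ x_j / y_j). Industry cost efficiency
    is the ratio of total minimum cost to total actual cost."""
    unit_cost = X @ w / y                 # each firm's unit cost at prices w
    min_cost = y * unit_cost.min()        # frontier cost of each firm's output
    return min_cost.sum() / (X @ w).sum()

# Search normalized common price vectors w = (s, 1-s) for the industry optimum.
shares = np.linspace(0.01, 0.99, 99)
ce = [industry_cost_efficiency(np.array([s, 1 - s])) for s in shares]
s_star = shares[int(np.argmax(ce))]
print(f"LoOP prices w = ({s_star:.2f}, {1 - s_star:.2f}), "
      f"industry cost efficiency = {max(ce):.3f}")
```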

    The role of benchmark technology in sustainable value analysis

    Sustainable Value Analysis (SVA) [F. Figge, T. Hahn, Ecol. Econ. 48 (2004) 173-187] is a method for measuring sustainability performance consistent with the constant capital rule and strong sustainability. SVA compares the eco-efficiency of a firm relative to some benchmark. The choice of the benchmark implies some assumptions regarding the underlying production technology. This paper presents a rigorous examination of the role of benchmark technology in SVA. We show that Figge and Hahn's formula for calculating sustainable value implies a peculiar linear benchmark technology. We present a generalized formulation of sustainable value that is not restricted to any particular functional form and allows for estimating the benchmark technology from empirical data. Our generalized SVA formulation reveals a direct link between SVA and frontier approaches to environmental performance measurement and facilitates statistical hypothesis testing concerning the benchmark.
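
    As a point of comparison for the linear benchmark discussed above, the sketch below computes sustainable value with a Figge-Hahn style formula, in which the benchmark creates value at the constant rate VA_b / x_b per unit of each resource. The data are hypothetical, and the paper's generalized formulation would replace this linear benchmark with one estimated from empirical data.

```python
import numpy as np

# Hypothetical data: value added VA and use of R resources for three firms,
# plus a benchmark (e.g., an economy-wide average).
VA = np.array([120.0, 90.0, 150.0])            # firms' value added
X = np.array([[10.0, 200.0],                   # firms' resource use
              [8.0, 150.0],
              [15.0, 260.0]])
VA_b, x_b = 100.0, np.array([10.0, 180.0])     # benchmark value added and use

def sustainable_value(va, x):
    """Figge-Hahn style sustainable value: average over resources of the value
    created above what the benchmark would create with the same resource use."""
    return np.mean(va - x * (VA_b / x_b))

for i in range(len(VA)):
    print(f"firm {i}: SV = {sustainable_value(VA[i], X[i]):.1f}")
```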

    Role of benchmark technology in sustainable value analysis: an application to Finnish dairy farms

    Sustainability is a multidimensional concept that entails economic, environmental, and social aspects. The sustainable value (SV) method is one of the most promising attempts to quantify the sustainability performance of firms. SV compares the performance of a firm to a benchmark, which must be estimated in one way or another. This paper examines alternative parametric and nonparametric methods for estimating the benchmark technology from empirical data. The reviewed methods are applied to empirical data on 332 Finnish dairy farms. The application reveals four interesting conclusions. First, the greater flexibility of the nonparametric methods is evident from their better empirical fit. Second, the negative skewness of the regression residuals of both parametric OLS and nonparametric CNLS speaks against the average-practice benchmark technology in this application. Third, high positive correlations across a wide spectrum of methods suggest that the findings are relatively robust. Fourth, the stochastic decomposition of the disturbance term to filter out the noise component from the inefficiency term yields more realistic efficiency estimates and performance targets.
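
    A minimal single-input convex nonparametric least squares (CNLS) sketch follows, using the cvxpy modeling library (my choice of tool, not the paper's) to impose the Afriat-type concavity and monotonicity constraints and to inspect the skewness of the resulting residuals. The data are simulated.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n = 30
x = rng.uniform(1, 10, n)
y = 3 * np.sqrt(x) + rng.normal(0, 0.3, n)   # concave truth plus noise

# CNLS: minimize the sum of squared residuals subject to concavity (Afriat
# inequalities) and monotonicity; each observation gets its own hyperplane.
alpha, beta = cp.Variable(n), cp.Variable(n, nonneg=True)
fit = alpha + cp.multiply(beta, x)
constraints = [alpha[i] + beta[i] * x[i] <= alpha[j] + beta[j] * x[i]
               for i in range(n) for j in range(n) if i != j]
prob = cp.Problem(cp.Minimize(cp.sum_squares(y - fit)), constraints)
prob.solve()

residuals = y - fit.value
skew = ((residuals - residuals.mean()) ** 3).mean() / residuals.std() ** 3
print("CNLS residual skewness:", round(float(skew), 3))
```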

    Neoclassical versus frontier production models? Testing for the skewness of regression residuals

    The empirical literature on production and cost functions is divided into two strands: 1) the neoclassical approach, which concentrates on model parameters, and 2) the frontier approach, which decomposes the disturbance term into a symmetric noise term and a positively skewed inefficiency term. We propose a theoretical justification for the skewness of the inefficiency term, arguing that this skewness is the key testable hypothesis of the frontier approach. We propose to test the regression residuals for skewness to distinguish between the two competing approaches. Our test builds directly upon the asymmetry of the regression residuals and does not require any prior distributional assumptions.

    Keywords: Firms and production; Frontier estimation; Hypotheses testing; Production function; Productive efficiency analysis
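
    One simple way to operationalize such a test is sketched below: fit the production function by OLS and assess the asymmetry of the residuals with a sign-flip resampling scheme, which is exact under the null of symmetric errors and needs no parametric distributional assumptions. The statistic and the resampling scheme are an illustrative variant, not necessarily the paper's exact test.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated production function with a one-sided inefficiency term.
n = 200
x = rng.uniform(0, 2, (n, 2))
u = rng.exponential(0.3, n)                  # inefficiency (one-sided)
v = rng.normal(0, 0.2, n)                    # symmetric noise
y = 1.0 + x @ np.array([0.6, 0.3]) + v - u

# OLS residuals.
Z = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(Z, y, rcond=None)[0]
e = y - Z @ beta

def skewness(r):
    r = r - r.mean()
    return (r ** 3).mean() / (r ** 2).mean() ** 1.5

# Sign-flip resampling: under the null of symmetric residuals (the neoclassical
# model), randomly flipping residual signs leaves the distribution unchanged,
# so the observed skewness should not be extreme among the flipped replicates.
stat = skewness(e)
flips = np.array([skewness(e * rng.choice([-1, 1], n)) for _ in range(2000)])
p = (np.abs(flips) >= abs(stat)).mean()
print(f"residual skewness = {stat:.3f}, sign-flip p-value = {p:.3f}")
```

    A significantly negative skewness supports the frontier model; a symmetric residual distribution is consistent with the neoclassical specification.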